πŸ““ Artificial Intelligence/Introduction to AI/Week 3 - Introduction/Definitions/Rectified_Linear_Unit_(Relu).md by @KGBicheno β˜†

Rectified Linear Unit (ReLU)

Go back to the [[AI Glossary]]

An activation function, often written as ReLU(x) = max(0, x), with the following rules:

If the input is negative or zero, the output is 0.
If the input is positive, the output is equal to the input.
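
These two rules can be sketched in plain Python (the function name `relu` and the use of `max` are illustrative choices, not part of the glossary definition):

```python
def relu(x):
    # Negative or zero input -> 0; positive input passes through unchanged.
    return max(0.0, x)

print(relu(-3.0))  # 0.0
print(relu(0.0))   # 0.0
print(relu(2.5))   # 2.5
```

In deep learning libraries this is usually provided as a built-in (e.g. `torch.nn.ReLU` or `tf.nn.relu`) rather than hand-written.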